ICA Using Spacings Estimates of Entropy

Authors

  • Erik G. Learned-Miller
  • John W. Fisher
Abstract

This paper presents a new algorithm for the independent components analysis (ICA) problem based on efficient spacings estimates of entropy. Like many previous methods, we minimize a standard measure of the departure from independence, the estimated Kullback-Leibler divergence between a joint distribution and the product of its marginals. To do this, we use a consistent and rapidly converging entropy estimator due to Vasicek. The resulting algorithm is simple, computationally efficient, intuitively appealing, and outperforms other well known algorithms. In addition, the estimator and the resulting algorithm exhibit excellent robustness to outliers. We present favorable comparisons to Kernel ICA, FAST-ICA, JADE, and extended Infomax in extensive simulations.
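
To make the approach concrete: after whitening, any rotation of the data leaves the joint entropy unchanged, so minimizing the estimated KL divergence between the joint distribution and the product of its marginals reduces to minimizing the sum of the marginal entropies of the rotated outputs. The Python sketch below estimates each marginal entropy with Vasicek's m-spacing estimator and grid-searches rotations of a two-source whitened mixture; the function names, the sqrt(n) spacing width, and the grid search are illustrative choices, not necessarily the authors' exact procedure.

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Vasicek's m-spacing estimate of differential entropy:
    the mean of log(n/(2m) * (x_(i+m) - x_(i-m))) over the sorted sample."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(round(np.sqrt(n))))  # common heuristic spacing width; an assumption here
    i = np.arange(n)
    hi = x[np.minimum(i + m, n - 1)]  # clamp order-statistic indices at the edges
    lo = x[np.maximum(i - m, 0)]
    spacings = np.maximum(hi - lo, 1e-12)  # guard against tied samples
    return float(np.mean(np.log(n / (2.0 * m) * spacings)))

def rotation(theta):
    """2-D rotation matrix."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def ica_two_sources(x_white, n_angles=150):
    """Grid search over rotations of whitened 2 x N data, scoring each
    candidate by the sum of marginal spacing-entropy estimates."""
    best_theta, best_score = 0.0, np.inf
    for theta in np.linspace(0.0, np.pi / 2, n_angles):
        y = rotation(theta) @ x_white
        score = vasicek_entropy(y[0]) + vasicek_entropy(y[1])
        if score < best_score:
            best_theta, best_score = theta, score
    return rotation(best_theta)

# Toy usage: unmix a rotated pair of independent, unit-variance uniform sources.
rng = np.random.default_rng(0)
s = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(2, 2000))
x = rotation(0.6) @ s            # already whitened, since s has identity covariance
w = ica_two_sources(x)
y = w @ x                        # recovered sources, up to sign and order
```

In higher dimensions the same objective is commonly optimized by sweeps of pairwise (Jacobi) rotations of the whitened data.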

Similar Articles

An Assessment of Hermite Function Based Approximations of Mutual Information Applied to Independent Component Analysis

At the heart of many ICA techniques is a nonparametric estimate of an information measure, usually via nonparametric density estimation, for example, kernel density estimation. While not as popular as kernel density estimators, orthogonal functions can be used for nonparametric density estimation (via a truncated series expansion whose coefficients are calculated from the observed data). While ...
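
A minimal sketch of the orthogonal-series idea described above, assuming orthonormal Hermite functions as the basis: each coefficient is estimated by the sample mean of the corresponding basis function, and the density estimate is the truncated expansion. The helper names and the truncation order are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.special import eval_hermite, gammaln

def hermite_fn(k, x):
    """Orthonormal Hermite function psi_k(x) = H_k(x) exp(-x^2/2) / sqrt(2^k k! sqrt(pi))."""
    log_norm = -0.5 * (k * np.log(2.0) + gammaln(k + 1) + 0.5 * np.log(np.pi))
    return np.exp(log_norm - 0.5 * x**2) * eval_hermite(k, x)

def series_density(sample, grid, order=8):
    """Truncated Hermite-series density estimate evaluated on a grid.
    Coefficient a_k is the empirical mean of psi_k over the sample."""
    est = np.zeros_like(grid, dtype=float)
    for k in range(order + 1):
        a_k = np.mean(hermite_fn(k, sample))
        est += a_k * hermite_fn(k, grid)
    return np.clip(est, 0.0, None)  # a truncated expansion can dip below zero

# Example: the estimate should roughly track a standard normal density.
rng = np.random.default_rng(0)
f_hat = series_density(rng.standard_normal(5000), np.linspace(-4, 4, 201))
```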

Hyperspacings and the Estimation of Information Theoretic Quantities

The estimation of probability densities from data is widely used as an intermediate step in the estimation of entropy, Kullback-Leibler (KL) divergence, and mutual information, and for statistical tasks such as hypothesis testing. We propose an alternative to density estimation (partitioning a space into regions whose approximate probability mass is known) that can be used for the same purposes....

Minimax Mutual Information Approach for ICA of Complex-Valued Linear Mixtures

Recently, the authors developed the Minimax Mutual Information algorithm for linear ICA of real-valued mixtures, which is based on a density estimate stemming from Jaynes’ maximum entropy principle. Since the entropy estimates result in an approximate upper bound on the actual mutual information of the separated outputs, minimizing this upper bound results in robust performance and good gene...

Introducing the New LICAD Algorithm to Solve the Local Permutation Problem of the ICA Algorithm

We present the new LICAD algorithm to solve the permutation problem of the ICA in the frequency domain and improve the separation quality. In the proposed algorithm, first, the sources' angles are estimated in each frequency bin using an ICA separating matrix. Then, these estimates are compared to the true values obtained from a pre-processing stage. If the difference among similar angles is le...
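
The snippet is cut off, but the matching step it describes can be sketched generically: for each frequency bin, choose the permutation of the estimated source angles that best agrees with the reference angles from the pre-processing stage, then reorder the rows of that bin's separating matrix accordingly. This is a hypothetical illustration of that comparison, not the LICAD algorithm itself; align_bin and its inputs are assumed names.

```python
import itertools
import numpy as np

def align_bin(est_angles, ref_angles):
    """Return the row permutation of one frequency bin's separating matrix
    whose estimated source angles best match the reference angles."""
    best_perm, best_err = None, np.inf
    for perm in itertools.permutations(range(len(ref_angles))):
        err = sum(abs(est_angles[p] - ref_angles[j]) for j, p in enumerate(perm))
        if err < best_err:
            best_perm, best_err = perm, err
    return best_perm

# Usage sketch: W_bin = W_bin[list(align_bin(est_angles, ref_angles)), :]
```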

Exploring transient transfer entropy based on a group-wise ICA decomposition of EEG data

This paper presents a data-driven pipeline for studying asymmetries in the mutual interdependencies between distinct components of the EEG signal. Due to volume conduction, estimating coherence between scalp electrodes may lead to spurious results. A group-based independent component analysis (ICA), which is conducted across all subjects and conditions simultaneously, is an alternative representation ...
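
The directed, asymmetric dependence this snippet targets can be made concrete with a plug-in estimate of ordinary (not transient) transfer entropy on discretized series. A minimal sketch; the function name, quantile binning, and single-lag history are simplifying assumptions, not the paper's pipeline.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """Plug-in transfer entropy TE(X -> Y) for two 1-D series after quantile
    discretization: sum of p(y1, y0, x0) * log[ p(y1 | y0, x0) / p(y1 | y0) ]."""
    def disc(z):
        edges = np.quantile(z, np.linspace(0, 1, bins + 1)[1:-1])
        return np.digitize(z, edges)
    xd, yd = disc(np.asarray(x, float)), disc(np.asarray(y, float))
    triples = list(zip(yd[1:], yd[:-1], xd[:-1]))  # (y_next, y_now, x_now)
    n = len(triples)
    c_abc = Counter(triples)
    c_ab = Counter((a, b) for a, b, _ in triples)
    c_bc = Counter((b, c) for _, b, c in triples)
    c_b = Counter(b for _, b, _ in triples)
    return sum(cnt / n * np.log(cnt * c_b[b] / (c_ab[a, b] * c_bc[b, c]))
               for (a, b, c), cnt in c_abc.items())

# Asymmetry check: y is driven by lagged x, so TE(x -> y) should exceed TE(y -> x).
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y = np.roll(x, 1) + 0.5 * rng.standard_normal(5000)
print(transfer_entropy(x, y), transfer_entropy(y, x))
```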

Journal:
  • Journal of Machine Learning Research

Volume: 4  Issue:

Pages: -

Publication date: 2003